Web Survey Bibliography
It is generally accepted that conducting surveys online is both faster and cheaper than more traditional survey methodologies, and these advantages have driven the growth of online surveys in recent years. As online surveys have grown, respondents are increasingly asked to answer more personal and sensitive questions online. It is therefore important to investigate and understand how respondents behave when answering sensitive questions, so that the most effective methodology can be employed.
A salient issue in online survey research is the removal of the interviewer. This is particularly relevant for sensitive topics, where the absence of an interviewer can reduce response bias. Much research has demonstrated that surveys administered online, without an interviewer present, are characterised by higher levels of self-disclosure (Weisband and Kiesler 1996), an increased willingness to answer sensitive questions (Tourangeau 2004) and reductions in socially desirable responding (Frick et al. 2001; Joinson 1999). Furthermore, survey methodologies that reduce the role of human interviewers in question administration (e.g. computer-assisted self-interviews) also increase responses to sensitive personal questions and yield more honest, candid answers.
As part of the ongoing experimental work at Ipsos MORI, we are investigating the effect of different survey methodologies on respondents' behaviour when answering sensitive questions.
In the present paper we report a two-part study. Part 1 searches for evidence of a survey mode effect on disclosure levels, examining data from participants interviewed in one of three conditions. In condition one, 1,645 members of the Ipsos Online Panel completed an online survey. In condition two, 902 participants were interviewed offline, face-to-face, using Computer Assisted Personal Interviewing (CAPI). Finally, in condition three, 1,028 participants were again interviewed offline, using Computer Assisted Self Interviewing (CASI). Direct comparisons were possible between the two offline samples. Allocation to the online sample, on the other hand, was not randomised, so propensity score adjustment was applied to control for possible confounding in the online/offline comparisons. Respondents were asked more than 50 questions on a variety of topics, from politics to media consumption, five of which were deemed sensitive. The sensitive items covered immigration, adultery, drink driving, abortion, and attitudes towards debt.
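The propensity score adjustment mentioned above can be illustrated with a minimal sketch. This is not the actual Ipsos model: the covariates (a scaled age variable and gender), the toy data, and the inverse-probability weighting scheme are all assumptions made purely for illustration. The idea is to estimate each respondent's probability of being in the online sample from observed covariates, then reweight the offline respondents so the two groups are comparable on those covariates.

```python
import math
import random

def fit_logistic(X, y, lr=0.5, epochs=1500):
    """Fit logistic regression P(online=1 | x) by batch gradient descent.
    X: rows of features (leading 1.0 is the intercept); y: 0/1 labels."""
    coefs = [0.0] * len(X[0])
    n = len(X)
    for _ in range(epochs):
        grad = [0.0] * len(coefs)
        for xi, yi in zip(X, y):
            p = 1.0 / (1.0 + math.exp(-sum(c * f for c, f in zip(coefs, xi))))
            for j, f in enumerate(xi):
                grad[j] += (p - yi) * f
        coefs = [c - lr * g / n for c, g in zip(coefs, grad)]
    return coefs

def propensity(coefs, xi):
    return 1.0 / (1.0 + math.exp(-sum(c * f for c, f in zip(coefs, xi))))

# Toy data (hypothetical): online panellists skew younger.
random.seed(1)
X, y = [], []
for _ in range(400):
    age = random.uniform(-1, 1)          # scaled age
    female = random.choice([0, 1])
    p_online = 1.0 / (1.0 + math.exp(-(-0.5 - 1.5 * age)))
    X.append([1.0, age, female])
    y.append(1 if random.random() < p_online else 0)

coefs = fit_logistic(X, y)
# ATT-style weights: online respondents keep weight 1;
# offline respondents are weighted by the propensity odds p/(1-p).
weights = [1.0 if yi == 1
           else propensity(coefs, xi) / (1.0 - propensity(coefs, xi))
           for xi, yi in zip(X, y)]
```

In practice a survey organisation would estimate the propensity model on many more covariates (demographics, media use, etc.) and would typically trim or stratify extreme weights; the sketch only shows the weighting logic.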
Part 2 examined the association between the level of sensitivity and the level of disclosure, and specifically any differences between the three survey modes. To estimate social sensitivity, an ad hoc panel of five experienced independent social researchers was sampled from a larger pool of experts and asked to rank the sensitivity of each of the five questions. After the ratings passed reliability tests of inter-rater agreement, the estimated sensitivity was correlated with item-level disclosure by mode.
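The two statistical steps above can be sketched in a few lines. The paper does not name the specific statistics used, so this example assumes two common choices: Kendall's coefficient of concordance (W) for agreement among the five raters' rankings, and a Spearman rank correlation between mean sensitivity rank and a per-item nonresponse rate. The rating matrix and nonresponse rates below are entirely hypothetical.

```python
def kendalls_w(ranks):
    """Kendall's W for m raters each ranking the same n items (no ties).
    ranks: m lists, each a permutation of 1..n. W=1 means perfect agreement."""
    m, n = len(ranks), len(ranks[0])
    totals = [sum(r[i] for r in ranks) for i in range(n)]
    mean_total = m * (n + 1) / 2
    s = sum((t - mean_total) ** 2 for t in totals)
    return 12 * s / (m ** 2 * (n ** 3 - n))

def spearman(x, y):
    """Spearman rank correlation, assuming no tied values."""
    def rank(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for pos, i in enumerate(order, start=1):
            r[i] = pos
        return r
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical sensitivity rankings from five raters for the five items
# (immigration, adultery, drink driving, abortion, debt) -- illustrative only.
ratings = [
    [2, 4, 5, 3, 1],
    [1, 4, 5, 3, 2],
    [2, 5, 4, 3, 1],
    [1, 4, 5, 3, 2],
    [2, 4, 5, 3, 1],
]
w_stat = kendalls_w(ratings)  # high value => raters broadly agree

mean_rank = [sum(r[i] for r in ratings) / len(ratings) for i in range(5)]
nonresponse = [0.03, 0.08, 0.11, 0.06, 0.02]  # hypothetical refusal rates
rho = spearman(mean_rank, nonresponse)
```

With real data this correlation would be computed separately within each mode (online, CAPI, CASI), so that mode differences in the sensitivity-disclosure association can be compared.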
Finally, implications for the handling of sensitive questions in survey research are discussed.
Web survey bibliography (281)
- Overview: Online Surveys; 2017; Vehovar, V.; Lozar Manfreda, K.
- Standard Definitions Final Dispositions of Case Codes and Outcome Rates for Surveys; 2016
- Retrospective Measurement of Students’ Extracurricular Activities with a Self-administered Calendar...; 2016; Furthmueller, P.
- Pitfalls, Potentials, and Ethics of Online Survey Research: LGBTQ and Other Marginalized and Hard-to...; 2016; McInroy, L. B.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- A Statistical Approach to Provide Individualized Privacy for Surveys; 2016; Esponda, F.; Huerta, K.; Guerrero, V. M.
- Social Media Analyses for Social Measurement; 2016; Schober, M. F.; Pasek, J.; Guggenheim, L.; Lampe, C.; Conrad, F. G.
- Doing Surveys Online; 2016; Toepoel, V.
- An Overview of Mobile CATI Issues in Europe; 2015; Slavec, A.; Toninelli, D.
- Utilizing iPads in the Field; 2015; Kiser, P.
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys 2015; 2015
- The Web Survey Revolution; 2015; Murray, D.
- Methodology of the RAND Mid-Term 2014 Election Panel; 2015; Carman, K. G.; Pollack, S.
- 28 Questions to Help Buyers of Online Samples; 2015; Cape, P. J.; Phillips, A.; Baker, R.; Cooke, M.; Ribeiro, E.; Terhanian, G.
- Ethical decision-making and Internet research 2.0: Recommendations from the AoIR ethics working committee...; 2015; Markham, A.; Buchanan, E. A.
- Doing online research involving university students with disabilities: Methodological issues; 2015; De Cesarei, A.; Baldaro, B.
- Exploring ethical issues associated with using online surveys in educational research; 2015; Roberts, L. D.; Allen, P. J.
- An Introduction to Survey Research; 2015; Cowles, E. L.; Nelson, E.
- Ethical issues in online research; 2015; James, N.; Busher, H.
- Leading Edge Insights: Foundations of Quality 2.0; 2014; Fuguitt, G.
- Methods and systems for managing an online opinion survey service; 2014; Mcloughlin, M. H.; Seton, N.; Blesy, K.
- Recent Books and Journals in Public Opinion, Survey Methods, and Survey Statistics; 2014; Callegaro, M.
- Undisclosed Privacy: The Effect of Privacy Rights Design on Response Rates; 2014; Haer, R.; Meidert, N.
- Tailoring mode of data collection in longitudinal studies; 2013; Kaminska, O.; Lynn, P.
- How do we Know Cognitive Interviewing is Any Good?; 2013; Willis, G. B.
- Quality of Web surveys; 2013; Revilla, M.
- Experiments in Obtaining Data Linkage Consent in Web Surveys; 2013; Sakshaug, J. W.; Kreuter, F.
- Response Burden in Official Business Surveys: Measurement and Reduction Practices of National Statistical...; 2013; Giesen, D.; Bavdaz, M.; Loefgren, T.; Raymond-Blaess, V.
- Internet as a new source of information for the production of official statistics. Experiences of Statistics...; 2013; Heerschap, N.
- A standard with quality indicators for web panel surveys: a Swedish example; 2013; Nyfjaell, M.
- How Mobile Stacks Up to Traditional Online: A Comparison of Studies; 2013; Knowles, R.
- How to make your questionnaire mobile-ready; 2013; Cape, P. J.
- Phish Rising: How Internet Criminals are Undermining the Viability of Online Survey Research…and...; 2013; Kunovic, K.
- Self-Reported Participation in Research Practices Among Survey Methodology Researchers; 2013; Perez-Vergara, K.; Smith, C.; Lowenstein, C.; Ozonoff, A.; Martins, Y.
- Ethics, privacy and data security in web-based course evaluation; 2013; Salaschek, M.; Meese, C.; Thielsch, M.
- Beyond methodology - some ethical implications of "doing research online"; 2013; Heise, N.
- Code Comparison; 2012
- Evaluation procedures for Survey questions; 2012; Saris, W. E.
- Transparency, Access and the Credibility of Survey Research; 2012; Lupia, A.
- Anonymity and Confidentiality; 2012; Tourangeau, R.
- Cognitive Evaluation of Survey Instruments: State of the Science (Art?) and Future Directions; 2012; Willis, G. B.
- How to provide high data quality in online-questionnaires: Setting guidelines in design; 2012; Tries, S.; Nebel, S.; Blanke, K.
- Comparability of Survey Measurements; 2012; Oberski, D.
- Classification of Surveys; 2012; Stoop, I.; Harrison, E.
- Enhancing Web Surveys With New HTML5 Input Types; 2012; Funke, F.
- Why one should incorporate the design weights when adjusting for unit nonresponse using response homogeneity...; 2012; Kott, P. S.
- Assessing the Quality of Survey Data; 2012; Blasius, J.
- Designing and Doing Survey Research; 2012; Andres, L.
- Using break-offs in web interviews for predicting web response in mixed mode surveys; 2011; Beukenhorst, D.
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys 2011; 2011